Perturbative Black Box Variational Inference
Abstract: Black box variational inference (BBVI) with reparameterization gradients triggered the exploration of divergence measures other than the Kullback-Leibler (KL) divergence, such as alpha divergences. In this paper, we view BBVI with generalized divergences as a form of estimating the marginal likelihood via biased importance sampling. The choice of divergence determines a bias-variance trade-off between the tightness of a bound on the marginal likelihood (low bias) and the variance of its gradient estimators. Drawing on variational perturbation theory from statistical physics, we use these insights to construct a family of new variational bounds. Enumerated by an odd integer order $K$, this family captures the standard KL bound for $K=1$ and converges to the exact marginal likelihood as $K\to\infty$. Compared to alpha divergences, our reparameterization gradients have lower variance. We show in experiments on Gaussian Processes and Variational Autoencoders that the new bounds are more mass-covering, that the resulting posterior covariances are closer to those of the true posterior, and that the bounds lead to higher likelihoods on held-out data.
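The bound family sketched in the abstract can be made concrete with a short Monte Carlo estimator. The sketch below is a minimal illustration, not the authors' implementation: it assumes the standard importance-weight notation $w(z) = p(x,z)/q(z)$ and a free reference point $V_0$, and the names perturbative_bound, log_joint, log_q, and sample_q are illustrative. The idea is to replace $\exp(\log w)$ with its order-$K$ Taylor polynomial around $V_0$; for odd $K$ that polynomial lower-bounds the exponential everywhere, so the estimate lower-bounds the marginal likelihood in expectation.

# Minimal sketch (assumed notation, not the paper's code) of the order-K
# perturbative bound: exp(log w) is replaced by its Taylor polynomial of
# degree K around a reference point V0, a global lower bound of exp() for
# odd K, so the Monte Carlo average lower-bounds p(x) in expectation.
import numpy as np
from math import factorial

def perturbative_bound(log_joint, log_q, sample_q, V0, K=3, num_samples=100_000, seed=0):
    """Monte Carlo estimate of the order-K bound L_K <= p(x) (odd K only)."""
    assert K % 2 == 1, "only odd K gives a guaranteed lower bound"
    rng = np.random.default_rng(seed)
    z = sample_q(rng, num_samples)             # draw z ~ q(z)
    log_w = log_joint(z) - log_q(z)            # log importance weights
    diff = log_w - V0
    poly = sum(diff**k / factorial(k) for k in range(K + 1))
    return np.exp(V0) * poly.mean()

For $K=1$ this reduces to $e^{V_0}(1 + \mathrm{ELBO} - V_0)$, which is maximized at $V_0 = \mathrm{ELBO}$ and then equals the exponentiated standard KL bound; as $K\to\infty$ the polynomial converges to the exponential and the estimator becomes plain importance sampling, matching the interpolation described in the abstract.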
Reviews: Perturbative Black Box Variational Inference
Summary: The authors present a new variational objective for approximate Bayesian inference. The variational objective is nicely framed as an interpolation between classic importance sampling and traditional ELBO-based variational inference. Properties of the variance of the importance sampling and ELBO estimators are studied and leveraged to create a better marginal likelihood bound with tractable variance properties. The new bound is based on a low-degree polynomial of the log-importance weight (termed the interaction energy). The traditional ELBO estimator is expressed as a first-order polynomial in their more general framework.
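The reviewer's point that the bound is a low-degree polynomial of the log-importance weight can be checked numerically. The toy example below is hypothetical and not from the paper: an unnormalized standard Gaussian target (true marginal likelihood $\sqrt{2\pi} \approx 2.5066$) is approximated by a deliberately mismatched Gaussian $q$. At the reference $V_0 = \mathrm{ELBO}$, the $K=1$ bound matches $\exp(\mathrm{ELBO})$ up to Monte Carlo noise, while the $K=3$ bound is tighter in this example, and both stay below the true value.

# Hypothetical toy check, not from the paper: the unnormalized target
# exp(-z^2/2) has marginal likelihood sqrt(2*pi) ~= 2.5066, and
# q = N(0, 1.5^2) is a deliberately mismatched approximation.
import numpy as np

rng = np.random.default_rng(1)
sigma_q = 1.5
z = rng.normal(0.0, sigma_q, size=500_000)
log_q = -0.5 * (z / sigma_q)**2 - np.log(sigma_q * np.sqrt(2 * np.pi))
log_w = -0.5 * z**2 - log_q                    # log importance weights

V0 = log_w.mean()                              # ELBO; optimal reference for K = 1
d = log_w - V0
L1 = np.exp(V0) * (1 + d).mean()               # equals exp(ELBO) up to MC noise
L3 = np.exp(V0) * (1 + d + d**2 / 2 + d**3 / 6).mean()
print(L1, L3, np.sqrt(2 * np.pi))              # expect L1 < L3 < 2.5066 here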